The Group Dantzig Selector

Authors

  • Han Liu
  • Jian Zhang
  • Xiaoye Jiang
  • Jun Liu
Abstract

We introduce a new method, the group Dantzig selector, for high-dimensional sparse regression with group structure, together with a convincing theory of why utilizing the group structure can be beneficial. Under a group restricted isometry condition, we obtain a significantly improved nonasymptotic ℓ2-norm bound over basis pursuit or the Dantzig selector, which ignore the group structure. To gain more insight, we also introduce a surprisingly simple and intuitive sparsity oracle condition to obtain a block ℓ1-norm bound, which is easily accessible to a broad audience in the machine learning community. Encouraging numerical results are also provided to support our theory.
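
The precise optimization problem is not stated on this page, but the following sketch gives one plausible formulation, written by analogy with the standard Dantzig selector and the group lasso; the grouping of the constraint and the choice of tuning parameter \lambda in the actual paper may differ. With groups g = 1, ..., G, coefficient blocks \beta_g, and corresponding design submatrices X_g, such a group Dantzig selector solves

    \hat{\beta} = \arg\min_{\beta} \sum_{g=1}^{G} \|\beta_g\|_2
    \quad \text{subject to} \quad
    \max_{1 \le g \le G} \|X_g^\top (y - X\beta)\|_2 \le \lambda.

The objective is the block ℓ1-norm referred to in the abstract, and the constraint bounds the correlation of each group of predictors with the residual, mirroring the ℓ∞ constraint of the ordinary Dantzig selector.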

Similar articles

DASSO: Connections Between the Dantzig Selector and Lasso

We propose a new algorithm, DASSO, for fitting the entire coefficient path of the Dantzig selector with a similar computational cost to the LARS algorithm that is used to compute the Lasso. DASSO efficiently constructs a piecewise linear path through a sequential simplex-like algorithm, which is remarkably similar to LARS. Comparison of the two algorithms sheds new light on the question of how ...

Dantzig selector homotopy with dynamic measurements

The Dantzig selector is a near ideal estimator for recovery of sparse signals from linear measurements in the presence of noise. It is a convex optimization problem which can be recast into a linear program (LP) for real data, and solved using some LP solver. In this paper we present an alternative approach to solve the Dantzig selector which we call “Primal Dual pursuit” or “PD pursuit”. It is...
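
As a concrete illustration of the LP recast mentioned above (a standard reformulation, not necessarily the exact one used in that paper), the Dantzig selector

    \hat{\beta} = \arg\min_{\beta} \|\beta\|_1
    \quad \text{subject to} \quad
    \|X^\top (y - X\beta)\|_\infty \le \lambda

can be turned into a linear program by introducing an auxiliary vector u with u \ge |\beta| componentwise:

    \min_{\beta, u} \; \mathbf{1}^\top u
    \quad \text{subject to} \quad
    -u \le \beta \le u, \qquad
    -\lambda \mathbf{1} \le X^\top (y - X\beta) \le \lambda \mathbf{1},

which any off-the-shelf LP solver can handle.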

The Double Dantzig

The Dantzig selector (Candes and Tao, 2007) is a new approach that has been proposed for performing variable selection and model fitting on linear regression models. It uses an L1 penalty to shrink the regression coefficients towards zero, in a similar fashion to the Lasso. While both the Lasso and Dantzig selector potentially do a good job of selecting the correct variables, several researcher...

Parallelism, uniqueness, and large-sample asymptotics for the Dantzig selector.

The Dantzig selector (Candès and Tao, 2007) is a popular ℓ1-regularization method for variable selection and estimation in linear regression. We present a very weak geometric condition on the observed predictors which is related to parallelism and, when satisfied, ensures the uniqueness of Dantzig selector estimators. The condition holds with probability 1, if the predictors are drawn from a co...

A Generalized Dantzig Selector with Shrinkage Tuning

The Dantzig selector performs variable selection and model fitting in linear regression. It uses an L1 penalty to shrink the regression coefficients towards zero, in a similar fashion to the Lasso. While both the Lasso and Dantzig selector potentially do a good job of selecting the correct variables, they tend to over-shrink the final coefficients. This results in an unfortunate trade-off. One ...

Publication date: 2010